
    Changes in the seawater salinity-oxygen isotope relation between last glacial and present: sediment core data and OGCM modelling

    The presently available paleotemperature data imply large ice-free areas in the Greenland-Iceland-Norwegian Seas during the Last Glacial Maximum (21,600 yr BP). From these temperatures and independent measurements of the oxygen isotope ratios of fossil foraminiferal shells, glacial sea surface salinities could be computed if the glacial relation between salinity and the water isotope ratio were known. For this study, a three-dimensional numerical ocean circulation model was employed to investigate the possible shape of this still imprecisely known relation and to reconstruct a physically consistent scenario of the northern North Atlantic for the glacial summer. This scenario turned out to be quite similar to modern winter conditions, whereas the required salinity vs. oxygen isotope relation of that time must have been very different from its modern counterpart.
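
    As a worked illustration of the computation this abstract alludes to (a sketch, not the study's actual procedure): one commonly used paleotemperature equation (after Shackleton, 1974) can be inverted for the water isotope value, and an assumed linear salinity vs. δ18O relation can then be inverted for salinity. The coefficients a and b below are illustrative modern-ocean placeholders; constraining their glacial counterparts is precisely the open problem the study addresses.

```python
import numpy as np

def d18o_water(T, d18o_calcite):
    """Solve the paleotemperature equation
        T = 16.9 - 4.38*(dc - dw) + 0.10*(dc - dw)**2
    for the water isotope value dw (per mil).
    Equation after Shackleton (1974); used here only as an example.
    """
    # With x = dc - dw, pick the physically relevant (small) root of
    # 0.10*x**2 - 4.38*x + (16.9 - T) = 0
    x = (4.38 - np.sqrt(4.38**2 - 4 * 0.10 * (16.9 - T))) / (2 * 0.10)
    return d18o_calcite - x

def salinity(dw, a=0.5, b=-17.0):
    """Invert an assumed linear relation dw = a*S + b.
    a and b are illustrative modern-ocean placeholders, NOT the
    glacial coefficients the study seeks to constrain.
    """
    return (dw - b) / a

# Hypothetical inputs: a 4 degC summer SST and a shell d18O of +4.5 per mil
dw = d18o_water(T=4.0, d18o_calcite=4.5)
print(f"d18O_water = {dw:.2f} per mil, S = {salinity(dw):.1f}")
```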

    Numerische Modellierung der Paläo-Ozeanographie des Glazialen Europäischen Nordmeers [Numerical modelling of the paleo-oceanography of the glacial Nordic Seas]


    Geostatistical interpretation of paleoceanographic data over large ocean basins - Reality and fiction

    A promising approach to reconstructing oceanographic scenarios of past time slices is to drive numerical ocean circulation models with sea surface temperatures, salinities, and ice distributions derived from sediment core data. Set up properly, this combination of boundary conditions provided by the data and physical constraints represented by the model can yield physically consistent sets of three-dimensional water mass distributions and circulation patterns. This approach is not only promising but also dangerous. Numerical models cannot be fed directly with data from single core locations that are distributed unevenly and, as is commonly the case, sparsely in space. Instead, most models require forcing data sets on a regular grid with no missing points, so some method of interpolation between the scattered source data and the model grid has to be employed. An ideal gridding scheme retains as much as possible of the information present in the sediment core data while generating as few artifacts as possible in the interpolated field. Based on a set of oxygen isotope ratios, we discuss several standard interpolation strategies, namely nearest neighbour schemes, bicubic splines, Delaunay triangulation, and ordinary and indicator kriging. We assess the gridded fields with regard to their physical consistency and their implications for the oceanic circulation.
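
    A minimal sketch of the gridding problem (not the authors' code; the core locations and values are invented) using two of the named strategies, nearest-neighbour and Delaunay-based linear interpolation, via scipy:

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical core locations (lon, lat) and d18O values (per mil)
rng = np.random.default_rng(42)
cores = np.column_stack([rng.uniform(-40, 10, 25),   # lon
                         rng.uniform(50, 80, 25)])   # lat
d18o = rng.normal(3.5, 0.5, 25)

# Regular 1-degree target grid, as an ocean model would require
lon, lat = np.meshgrid(np.arange(-40, 11), np.arange(50, 81))

# Two of the interpolation strategies discussed above
nearest = griddata(cores, d18o, (lon, lat), method='nearest')
linear = griddata(cores, d18o, (lon, lat), method='linear')  # Delaunay-based

# 'linear' is NaN outside the convex hull of the cores -- exactly the
# kind of gap a model forcing field must not contain; one simple fix
# is to fall back to the nearest-neighbour field there.
filled = np.where(np.isnan(linear), nearest, linear)
print(f"gap points filled: {np.isnan(linear).sum()}")
```

    The artifact trade-off the abstract warns about is visible even here: the nearest-neighbour fallback removes the gaps but introduces plateau-like discontinuities into the forcing field.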

    Newsletter of the Digital Earth Project: Contributions of the Alfred Wegener Institute to Digital Earth

    As an important technical pillar of Digital Earth, the AWI computing centre provides data management and cloud processing services to the project partners. We develop project-specific extensions to the AWI data flow framework O2A (Observation to Archive). Sensor registration in O2A will support flexible handling of sensors and their metadata; for the Digital Earth showcases, for example, methane and soil moisture measurements are in focus for smart monitoring designs and for access to data in near real time (NRT). Furthermore, data exploration is supported by a raster data manager service that can easily be coupled in users' data workflows with other data sources, such as NRT sensor data. In the following we give more details on O2A, its components, and its concept.
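
    A hedged sketch of how such coupling could look in a user's workflow, assuming the raster service is exposed via the OGC WMS standard (which the O2A framework supports, per the O2A abstract below); the endpoint URL and layer name here are invented:

```python
import requests

# Hypothetical WMS endpoint and layer name -- the actual service
# addresses of the O2A raster data manager are not given in the text.
WMS = "https://example.awi.de/rasterdata/wms"
params = {
    "service": "WMS", "version": "1.3.0", "request": "GetMap",
    "layers": "soil_moisture_nrt",   # hypothetical layer
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "50,-40,80,10",          # WMS 1.3.0: lat,lon axis order
    "width": 512, "height": 512, "format": "image/png",
}
resp = requests.get(WMS, params=params, timeout=30)
resp.raise_for_status()
with open("soil_moisture.png", "wb") as f:
    f.write(resp.content)
```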

    UltraMassExplorer: A Browser-Based Application for the Evaluation of High-Resolution Mass Spectrometric Data

    In the evaluation of high-resolution mass spectrometric data, a considerable amount of time and computational power can be spent on matching molecular formulas to the neutral masses of measured ions. When evaluating multiple samples using the classical combinatorial approach based on molecular building blocks and nested loops, the time-consuming step of calculating the molecular mass may be repeated for the same molecular formula multiple times. To avoid repetitive calculations, we implemented a formula-library-based search approach in our data evaluation pipeline. In our approach, the step of calculating molecular formulas and corresponding masses is limited to the process of building a library. The library calculation requires an a priori definition of the maximum molecular mass and the isotopes contained, e.g. formulas in the mass range of ≤ 650 Da consisting of 12C, 1H, 14N, 16O, 31P, 32S, 13C, and 34S. The subsequent matching process is based on scrolling through a mass-sorted formula library and comparing it with a mass-sorted list of measured peaks. The time required for processing is primarily a function of the size of the formula library. Consequently, at constant library size, the matching algorithm becomes more efficient with an increasing number of supplied peaks (up to 4700 formula assignments per second on a standard workstation) and is thus particularly suited for processing large datasets. We implemented the matching algorithm in our R Shiny-based interactive evaluation software UltraMassExplorer (UME). In combination with the graphical user interface of UME, our algorithm provides the basis for fast and reproducible (re-)analysis of complete sample sets with currently up to 400,000 peaks in a user-friendly, integrated environment. The code of our open-source algorithm is available through the UME website [1].

    References: [1] www.awi.de/en/um
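
    A minimal sketch of the described matching strategy (the UME implementation is R-based and not shown in the text; the names, masses, and tolerance handling below are assumptions): both the formula library and the peak list are sorted by mass, and a single forward scan assigns formulas within a ppm window, so no molecular mass is ever recomputed.

```python
import bisect

def build_library(formulas_with_masses):
    """Computed once: (mass, formula) pairs sorted by mass."""
    return sorted(formulas_with_masses)

def match_peaks(library, peaks, ppm=1.0):
    """Assign library formulas to measured neutral masses.

    library -- list of (mass, formula), sorted by mass
    peaks   -- iterable of measured neutral masses
    """
    masses = [m for m, _ in library]
    out = []
    lo = 0
    for peak in sorted(peaks):          # scroll through both sorted lists
        tol = peak * ppm * 1e-6
        # the left edge only ever advances, never moves backwards
        lo = bisect.bisect_left(masses, peak - tol, lo)
        hi = bisect.bisect_right(masses, peak + tol, lo)
        out.append((peak, [library[i][1] for i in range(lo, hi)]))
    return out

lib = build_library([(18.010565, "H2O"), (28.006148, "N2"),
                     (44.026215, "C2H4O")])
print(match_peaks(lib, [44.02625, 18.0106], ppm=5))
```

    Because the scan is linear in the number of peaks once the library is built, throughput grows with the number of supplied peaks, consistent with the behaviour the abstract reports.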

    O2A - Data Flow Framework from Sensor Observations to Archives

    The Alfred Wegener Institute coordinates German polar research and is one of the most productive polar research institutions worldwide, with scientists working in both polar regions – a task that can only be successful with the help of excellent infrastructure and logistics. Conducting research in the Arctic and Antarctic requires research stations staffed throughout the year as the basis for expeditions and data collection. It needs research vessels, aircraft, and long-term observatories for large-scale measurements as well as sophisticated technology. In this sense, the AWI also provides this infrastructure and competence to national and international partners.

    To meet this challenge, the AWI has been progressively developing and sustaining an e-infrastructure for coherent discovery, visualization, dissemination, and archival of scientific information and data. Most of the data originates from research activities carried out on a wide range of sea-, air-, and land-based research platforms. Archival and publishing in the PANGAEA repository, along with DOI assignment to individual datasets, is the pursued end-of-line step. Within the AWI, a workflow for data acquisition from vessel-mounted devices, along with ingestion procedures for the raw data into the institutional archives, has been well established. However, the increasing number of ocean-based stations and respective sensors, along with heterogeneous project-driven requirements towards satellite communication, sensor monitoring, quality control and validation, processing algorithms, visualization, and dissemination, has recently led us to build a more generic and cost-effective framework, hereafter named O2A (observations to archives).

    The main strengths of our framework (https://www.awi.de/en/data-flow) are the seamless flow from sensor observation to archive and its compliance with internationally used OGC standards (e.g. SOS/SWE, WMS, WFS), assuring interoperability in an international context. O2A comprises several extensible and exchangeable modules (e.g. controlled vocabularies and gazetteers, file type and structure validation, aggregation solutions, processing algorithms) as well as various interoperability services. We provide integrated tools for standardized platform, device, and sensor descriptions following SensorML (https://sensor.awi.de), automated near-real-time and "big data" streams supporting SOS and O&M, and dashboards allowing data specialists to monitor their data streams for trends and early detection of sensor malfunction (https://dashboard.awi.de). Also, in the context of the Helmholtz Data Federation and with an outlook towards the European Open Science Cloud, we are developing a cloud-based workspace providing user-friendly solutions for data storage on the petabyte scale and state-of-the-art computing solutions (Hadoop, Spark, Notebooks, rasdaman, etc.) to support scientists in collaborative data analysis and visualization activities, including geo-information systems (http://maps.awi.de). Our affiliated repositories offer archival and long-term preservation as well as publication solutions for data, data products, publications, presentations, and field reports (https://www.pangaea.de, https://epic.awi.de).
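
    The abstract names the OGC standards but gives no request examples. As a hedged sketch (the endpoint path, offering, and property identifiers are invented; only the SOS 2.0 key-value parameters are standard), a near-real-time stream could be queried like this:

```python
import requests

# Hypothetical SOS 2.0 endpoint -- the text names the standard
# (SOS/SWE, O&M) but not a concrete service URL.
SOS = "https://example.awi.de/sos/service"
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    # offering and observedProperty identifiers are invented:
    "offering": "vessel:polarstern:thermosalinograph",
    "observedProperty": "sea_water_temperature",
    "temporalFilter": ("om:phenomenonTime,"
                       "2019-01-01T00:00:00Z/2019-01-02T00:00:00Z"),
    "responseFormat": "http://www.opengis.net/om/2.0",
}
resp = requests.get(SOS, params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:500])  # O&M-encoded observations (XML)
```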

    Automatic data quality control for understanding extreme climate events

    The understanding of extreme events strongly depends on knowledge gained from data. Data integration across multiple sources, scales, and earth compartments is the focus of the project Digital Earth, which also joins efforts on the quality control of data. Automatic quality control is embedded in the ingest component of O2A, the observation-to-archive data flow framework of the Alfred Wegener Institute. In that framework, O2A-Sensor provides observation properties to O2A-Ingest, which delivers quality-flagged data to the O2A-Dashboard. The automatic quality control currently follows a procedural approach, in which modules implement formulations found in the literature and in other operational observatory networks. A set of plausibility tests, including range, spike, and gradient tests, is currently operational. The automatic quality control scans the ingested data in near real time (NRT), builds a table of devices, and searches, either by absolute or derivative values, for the correctness and validity of observations. The availability of observation properties, for instance test parameters like physical or operational ranges, triggers the automatic quality control, which in turn iterates through the table of devices to set the quality flag for each sample and observation. To date, the quality flags in use are sequential and qualitative, i.e. they describe a level of quality in the data. A new flagging system is under development that will include a descriptive characteristic comprising technical and user interpretation. Within Digital Earth, data on flood and drought events along the Elbe River and on methane emissions in the North Sea are to be reviewed using automatic quality control. Fast and scalable automatic quality control will disentangle uncertainty raised by quality issues and thus improve our understanding of extreme events in those cases.
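
    A hedged sketch of the three named plausibility tests (the thresholds, flag values, and function names are invented; the operational O2A modules are not shown in the text):

```python
import numpy as np

GOOD, BAD = 1, 4  # illustrative flag values, not the O2A scheme

def range_test(x, lo, hi):
    """Flag samples outside the physical/operational range."""
    return np.where((x < lo) | (x > hi), BAD, GOOD)

def spike_test(x, threshold):
    """Flag samples deviating from the mean of their two
    neighbours by more than `threshold` (absolute value)."""
    flags = np.full(len(x), GOOD)
    dev = np.abs(x[1:-1] - (x[:-2] + x[2:]) / 2)
    flags[1:-1][dev > threshold] = BAD
    return flags

def gradient_test(x, max_step):
    """Flag samples whose step from the previous sample
    (a derivative value) exceeds `max_step`."""
    flags = np.full(len(x), GOOD)
    flags[1:][np.abs(np.diff(x)) > max_step] = BAD
    return flags

# Hypothetical NRT temperature series with a spike and a dropout
temp = np.array([4.1, 4.2, 9.9, 4.3, 4.2, -21.0, 4.1])
flags = np.maximum.reduce([range_test(temp, -2.0, 30.0),
                           spike_test(temp, 3.0),
                           gradient_test(temp, 5.0)])
print(flags)  # worst flag per sample: 1 = good, 4 = bad
```

    Taking the worst flag across tests per sample mirrors the sequential, qualitative flagging the abstract describes; the planned descriptive flags would carry additional interpretation alongside such a level.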

    Permafrost-related research data - their accessibility, visualization, and publication using GIS and WebGIS technology

    Permafrost regions are highly sensitive to climate change. The monitoring of key variables and the identification of relevant processes are of utmost importance in these environments. ESA DUE GlobPermafrost (www.globpermafrost.info) provides a remote sensing data service for permafrost research and applications. This service was extended by permafrost modelling (time series) implemented in the new ESA CCI+ Permafrost project (2018-2021). The service comprises the generation of remote sensing products for various regions and spatial scales as well as the specific infrastructure for their visualisation, dissemination, and access – PerSys. PerSys is the ESA GlobPermafrost geospatial information service for publishing and visualising information and data products for the public. Data products are described and searchable in the PerSys data catalogue (apgc.awi.de), and data visualisation employs the AWI WebGIS infrastructure maps@awi (http://maps.awi.de), a highly scalable data visualisation unit within the AWI data-workflow framework O2A (Observation to Archive). The maps@awi WebGIS technology supports the project-specific visualisation of raster and vector data products of any spatial resolution and remote sensing origin. This is a prerequisite for visualising the wide range of GlobPermafrost remote sensing products, such as Landsat multispectral index trends (Tasseled Cap Brightness, Greenness, and Wetness; Normalized Difference Vegetation Index, NDVI), Arctic land cover (e.g. shrub height, vegetation composition), lake ice grounding, InSAR-based land surface deformation, rock glacier velocities, and spatially distributed permafrost model output with permafrost probability and ground temperature per pixel. We established several WebGIS projects adapted to the products' specific spatial scales. For example, the WebGIS 'Arctic' visualises the circum-Arctic products, while highly resolved data products for rock glacier movements are visualised at regional scales in the WebGIS projects 'Alps', 'Andes', and 'Central Asia'. The PerSys WebGIS also visualises the stations of the WMO GCOS ground monitoring networks of the permafrost community: the Global Terrestrial Network for Permafrost (GTN-P), managed by the International Permafrost Association (IPA). The PerSys WebGIS has been continuously adapted in close cooperation with users at user workshops and conferences and with the IPA.